| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| forgellm | 0.3.7 | A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing, and publishing of language models with MLX-LM | 2025-07-09 00:17:29 |
| evopt | 0.14.3 | User-friendly data-driven numerical optimization | 2025-03-18 21:03:10 |
| arbor-ai | 0.1.3 | A framework for fine-tuning and managing language models | 2025-03-09 19:00:44 |
| elora | 0.0.3 | eLoRA: Efficient Rank Allocation for Budget-Constrained Fine-Tuning 🧮💰⚙️🎛️ | 2025-03-01 20:31:11 |
| optimum-neuron | 0.0.28 | Optimum Neuron is the interface between the Hugging Face Transformers and Diffusers libraries and AWS Trainium and Inferentia accelerators. It provides a set of tools enabling easy model loading, training, and inference on single- and multiple-Neuron-core settings for different downstream tasks. | 2025-02-07 10:24:33 |
| datadreamer.dev | 0.46.0 | Prompt. Generate synthetic data. Train & align models. | 2025-02-02 21:20:05 |
| data-prep-toolkit | 0.2.3.post1 | Data Preparation Toolkit library for Ray and Python | 2025-01-28 17:41:19 |
| data-prep-toolkit-transforms | 1.0.0 | Data Preparation Toolkit transforms using Ray | 2025-01-24 20:49:37 |
| quicktunetool | 0.0.3 | A framework for efficient model selection and hyperparameter optimization | 2025-01-09 09:50:44 |
| optimum-habana | 1.15.0 | Optimum Habana is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU). It provides a set of tools enabling easy model loading, training, and inference on single- and multi-HPU settings for different downstream tasks. | 2024-12-25 19:38:22 |
| finetuning-scheduler | 2.5.0 | A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules. | 2024-12-20 19:13:50 |
| fimm | 0.0.3 | Fine-tune PyTorch image models with TIMM | 2024-12-16 11:15:56 |
| data-prep-toolkit-lang | 1.0.0a0 | Data Preparation Toolkit transforms using Ray | 2024-12-12 15:54:19 |
| data-prep-connector | 0.2.3 | Scalable and compliant web crawler | 2024-11-22 16:39:59 |
| optimum-tpu | 0.2.0 | Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices. | 2024-11-20 13:07:21 |
| functioncalming | 0.0.9 | Robust and reliable OpenAI function calling | 2024-11-13 12:42:05 |
| prebuilt-RAG-LU | 1.0.4 | A library for building Retrieval-Augmented Generation (RAG) systems using ChromaDB and popular large language models (LLMs). | 2024-10-28 07:22:36 |
| FineTune-Information-Extractor-for-NLPTasks-based-mBART | 1.0.7 | A library for fine-tuning mBART models to perform information extraction for various NLP tasks. | 2024-10-24 12:57:11 |
| FineTune-Information-Extractor-for-NLPTasks-based-T5-Small | 1.0.5 | A library for fine-tuning T5-small models to perform information extraction for various NLP tasks. | 2024-10-15 08:22:22 |
| data-prep-toolkit-transforms-lang1 | 0.2.2 | Data Preparation Toolkit transforms | 2024-09-29 12:14:49 |